DTE AICCOMAS 2025

A Metriplectic Transformer Architecture for Dissipative Dynamical Systems

  • Urdeitx, Pau (Universidad de Zaragoza)
  • Chinesta, Francisco (ENSAM Institute of Technology)
  • Cueto, Elías (Universidad de Zaragoza)


The use of deep neural networks to learn dynamical systems offers a novel approach to the description of different physical systems. Although the training of these networks is fundamentally data-driven, several authors have developed techniques and architectures that incorporate a physical bias during learning, generally observing an improvement in network performance [1]. In many dynamical systems, the physical description must incorporate phenomenological variables that capture internal aspects of the system depending on its previous history. Architectures such as RNNs, CNNs, and LSTMs allow this history to be encoded as memory in the network, so that the evolution of the system can be modelled as history-dependent. However, in all these cases the memory is localized and limited. Transformers, in contrast, are capable of managing long-term dependencies, showing great potential for learning the evolution of systems with history-dependent behaviour [2]. We have developed a new Transformer architecture to learn these phenomenological variables. We incorporate physical consistency, the physics bias, by imposing a metriplectic structure on the network based on the GENERIC formalism [3]. We study the reconstruction of different dynamical systems with the proposed Structure-Preserving Transformer architecture. Through the attention mechanism, the evolution of the system depends on its previous history, incorporating into the architecture the contribution of these phenomenological variables.

[1] Cicirello, A. (2024). Physics-Enhanced Machine Learning: a position paper for dynamical systems investigations. http://arxiv.org/abs/2405.05987.
[2] Geneva, N., & Zabaras, N. (2022). Transformers for modeling physical systems. Neural Networks, 146, 272–289. https://doi.org/10.1016/j.neunet.2021.11.022.
[3] Hernández, Q., Badías, A., González, D., Chinesta, F., & Cueto, E. (2021). Structure-preserving neural networks. Journal of Computational Physics, 426, 109950. https://doi.org/10.1016/j.jcp.2020.109950.
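To illustrate the metriplectic (GENERIC) structure the abstract imposes on the network, the sketch below shows one explicit-Euler GENERIC step for a toy damped harmonic oscillator. This is an assumed illustration, not the authors' implementation: the state layout, the damping coefficient `nu`, and all function names are ours. The skew-symmetric Poisson matrix L drives the reversible dynamics, the symmetric positive semi-definite friction matrix M drives dissipation, and the degeneracy conditions L∇S = 0 and M∇E = 0 enforce energy conservation and non-negative entropy production.

```python
# Illustrative sketch of one GENERIC (metriplectic) time step.
# Assumption: toy system and names are ours, not the paper's code.
#
# GENERIC dynamics:  dz/dt = L(z) grad E(z) + M(z) grad S(z)
# with degeneracy conditions  L grad S = 0  and  M grad E = 0.
import numpy as np

def generic_step(z, L, M, grad_E, grad_S, dt):
    """One explicit-Euler GENERIC update of the state z."""
    return z + dt * (L @ grad_E(z) + M @ grad_S(z))

# Toy state z = (q, p, s): position, momentum, and an entropy-like
# variable s that absorbs the dissipated energy.
def grad_E(z):                     # E = q**2/2 + p**2/2 + s
    q, p, s = z
    return np.array([q, p, 1.0])

def grad_S(z):                     # S = s
    return np.array([0.0, 0.0, 1.0])

L = np.array([[0.0, 1.0, 0.0],     # skew-symmetric Poisson matrix
              [-1.0, 0.0, 0.0],    # (canonical symplectic block)
              [0.0, 0.0, 0.0]])

nu = 0.1                           # damping coefficient (assumed)

def M(z):
    """Symmetric PSD friction matrix, built so M(z) @ grad_E(z) == 0."""
    q, p, s = z
    return nu * np.array([[0.0, 0.0, 0.0],
                          [0.0, 1.0, -p],
                          [0.0, -p, p * p]])

z = np.array([1.0, 0.5, 0.0])
z_next = generic_step(z, L, M(z), grad_E, grad_S, dt=1e-3)
```

With this construction the degeneracy conditions hold exactly, so energy is conserved to integrator accuracy while the entropy variable s is non-decreasing; in a Structure-Preserving network, L and M (or their Cholesky-like factors) are the learned objects, with the degeneracies enforced architecturally.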